On the way towards a generalized entropy maximization procedure

Author

  • G. Baris Bağcı
Abstract

We propose a generalized entropy maximization procedure, which takes into account the generalized averaging procedures and information gain definitions underlying the generalized entropies. This novel generalized procedure is then applied to Rényi and Tsallis entropies. The generalized entropy maximization procedure for Rényi entropies results in the exponential stationary distribution asymptotically for q ∈ [0, 1] in contrast to the stationary distribution of the inverse power law obtained through the ordinary entropy maximization procedure. Another result of the generalized entropy maximization procedure is that one can naturally obtain all the possible stationary distributions associated with the Tsallis entropies by employing either ordinary or q-generalized Fourier transforms in the averaging procedure.
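As background for the abstract above, the standard (textbook) definitions of the Tsallis and Rényi entropies can be sketched as below. This is only the conventional starting point, not the generalized averaging procedure the paper proposes; both quantities reduce to the Shannon entropy in the limit q → 1, which is the regime (q ∈ [0, 1]) the abstract discusses.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1); Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def renyi_entropy(p, q):
    """Renyi entropy R_q = ln(sum p_i^q) / (1 - q); Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

# Example distribution: both generalized entropies approach the Shannon
# value -sum p_i ln p_i as q approaches 1.
p = [0.5, 0.3, 0.2]
print(tsallis_entropy(p, 0.999))
print(renyi_entropy(p, 0.999))
```

For q = 2, for instance, the Tsallis entropy is simply one minus the "collision probability" Σp², while the Rényi entropy is its negative logarithm, which illustrates how the two families weight the same moments Σp^q differently.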


Similar articles

Defaults and Infinitesimals: Defeasible Inference by Nonarchimedean Entropy-Maximization

We develop a new semantics for defeasible inference based on extended probability measures allowed to take infinitesimal values, on the interpretation of defaults as generalized conditional probability constraints, and on a preferred-model implementation of entropy-maximization.


Combined Minimum Relative Entropy and Maximum Likelihood estimation of dynamic models

This paper proposes an approach to estimating dynamic economic models that is based on a combination of relative entropy minimization (MRE) and log-likelihood maximization (MLE). It is assumed that there is not enough data to estimate the model in the conventional manner, but that it is nevertheless required to fit the available data, given a priori knowledge regarding the stochastic distribution of t...


Generalized Entropy Concentration for Counts

We consider the phenomenon of entropy concentration under linear constraints in a discrete setting, using the "balls and bins" paradigm, but without the assumption that the number of balls allocated to the bins is known. Therefore, instead of frequency vectors and ordinary entropy, we have count vectors with unknown sum and a certain generalized entropy. We show that if the constraints bound th...


Tsallis' entropy maximization procedure revisited

The proper way of averaging is an important question with regard to Tsallis' thermostatistics. Three different procedures have thus far been employed in the pertinent literature. The third one, i.e., the Tsallis-Mendes-Plastino (TMP) [1] normalization procedure, exhibits clear advantages with respect to earlier ones. In this work, we advance a distinct (from the TMP-one) way of handling the L...


Fuzzy clustering with the generalized entropy of feature weights

Fuzzy c-means (FCM) is an important clustering algorithm. However, it does not consider the impact of different features on clustering. In this paper, we present a fuzzy clustering algorithm with the generalized entropy of feature weights (GEWFCM). By introducing feature weights and adding a regularized term of their generalized entropy, a new objective function is proposed in terms of objecti...



Journal:

Volume   Issue 

Pages  -

Publication date: 2009